
    Coulomb Drag and Magnetotransport in Graphene Double Layers

    We review the fabrication and key transport properties of graphene double layers, consisting of two graphene monolayers placed in close proximity, independently contacted, and separated by an ultra-thin dielectric. We outline a simple band structure model relating the layer densities to the applied gate and inter-layer biases, and show that calculations and experimental results are in excellent agreement both at zero and in high magnetic fields. Coulomb drag measurements, which probe the electron-electron scattering between the two layers, reveal two distinct regimes: (i) diffusive drag at elevated temperatures, and (ii) mesoscopic fluctuation-dominated drag at low temperatures. We discuss the Coulomb drag results within the framework of existing theories. Comment: 9 pages, 6 figures

    Tree-guided group lasso for multi-response regression with structured sparsity, with an application to eQTL mapping

    We consider the problem of estimating a sparse multi-response regression function, with an application to expression quantitative trait locus (eQTL) mapping, where the goal is to discover genetic variations that influence gene-expression levels. In particular, we investigate a shrinkage technique capable of capturing a given hierarchical structure over the responses, such as a hierarchical clustering tree with leaf nodes for responses and internal nodes for clusters of related responses at multiple granularities, and we seek to leverage this structure to recover covariates relevant to each hierarchically defined cluster of responses. We propose a tree-guided group lasso, or tree lasso, for estimating such structured sparsity under multi-response regression by employing a novel penalty function constructed from the tree. We describe a systematic weighting scheme for the overlapping groups in the tree penalty such that each regression coefficient is penalized in a balanced manner despite the inhomogeneous multiplicity of group memberships of the regression coefficients due to overlaps among groups. For efficient optimization, we employ a smoothing proximal gradient method that was originally developed for a general class of structured-sparsity-inducing penalties. Using simulated and yeast data sets, we demonstrate that our method shows superior performance in terms of both prediction errors and recovery of true sparsity patterns, compared to other methods for multivariate-response regression. Comment: Published at http://dx.doi.org/10.1214/12-AOAS549 in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org)
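    The penalty described in the abstract sums weighted l2 norms over overlapping groups of responses defined by the tree nodes, applied to each covariate's coefficients. A minimal sketch of evaluating such a penalty is below; the function name, group representation, and example weights are illustrative assumptions, not the authors' code.

    ```python
    import numpy as np

    def tree_lasso_penalty(B, groups, weights, lam=1.0):
        """Evaluate a tree-guided group lasso penalty (illustrative sketch).

        B       : (p, k) coefficient matrix, one column per response
        groups  : list of column-index lists, one per tree node, each holding
                  the responses (leaves) under that node; groups overlap
        weights : per-group weights, standing in for the balanced weighting
                  scheme the abstract describes
        lam     : overall regularization strength
        """
        # For each group, take the l2 norm of each covariate's coefficients
        # across the group's responses, then sum over covariates and groups.
        return lam * sum(
            w * np.linalg.norm(B[:, g], axis=1).sum()
            for g, w in zip(groups, weights)
        )

    # Toy tree over 2 responses: two leaf groups and one root group.
    B = np.ones((2, 2))
    penalty = tree_lasso_penalty(B, groups=[[0], [1], [0, 1]],
                                 weights=[0.5, 0.5, 0.5])
    ```

    Because the root group couples both responses, a covariate relevant to the whole cluster is shrunk jointly, while the leaf groups still allow response-specific sparsity.
    
    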

    Spin-Polarized to Valley-Polarized Transition in Graphene Bilayers at ν=0 in High Magnetic Fields

    We investigate the transverse electric field (E) dependence of the ν=0 quantum Hall state (QHS) in dual-gated graphene bilayers in high magnetic fields. The longitudinal resistivity (ρ_xx) measured at ν=0 shows an insulating behavior which is strongest in the vicinity of E=0 and at large E-fields. At a fixed perpendicular magnetic field (B), the ν=0 QHS undergoes a transition as a function of E, marked by a minimum, temperature-independent ρ_xx. This observation is explained by a transition from a spin-polarized ν=0 QHS at small E-fields to a valley- (layer-) polarized ν=0 QHS at large E-fields. The E-field value at which the transition occurs has a linear dependence on B. Comment: 5 pages, 5 figures

    Smoothing Proximal Gradient Method for General Structured Sparse Learning

    We study the problem of learning high-dimensional regression models regularized by a structured-sparsity-inducing penalty that encodes prior structural information on either the input or the output side. We consider two widely adopted types of such penalties as our motivating examples: 1) the overlapping group lasso penalty, based on the l1/l2 mixed norm, and 2) the graph-guided fusion penalty. For both types of penalties, due to their non-separability, developing an efficient optimization method has remained a challenging problem. In this paper, we propose a general optimization approach, called the smoothing proximal gradient method, which can solve structured sparse regression problems with a smooth convex loss and a wide spectrum of structured-sparsity-inducing penalties. Our approach is based on a general smoothing technique of Nesterov. It achieves a convergence rate faster than the standard first-order approach, the subgradient method, and is much more scalable than the most widely used interior-point method. Numerical results are reported to demonstrate the efficiency and scalability of the proposed method. Comment: arXiv admin note: substantial text overlap with arXiv:1005.471
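    The idea sketched in the abstract can be illustrated on the graph-guided fusion case: Nesterov's technique replaces the nonsmooth penalty ||Cβ||_1 with a smooth surrogate whose gradient is cheap, after which an accelerated (FISTA-style) gradient scheme applies. The sketch below is a toy illustration under that reading of the abstract; the function names, the smoothing parameter μ, and the iteration budget are assumptions, not the paper's implementation.

    ```python
    import numpy as np

    def smoothed_grad(C, beta, mu):
        # Nesterov smoothing of Omega(beta) = ||C beta||_1:
        # the maximizing dual variable is clip(C beta / mu, -1, 1),
        # and the gradient of the smoothed penalty is C^T alpha.
        alpha = np.clip(C @ beta / mu, -1.0, 1.0)
        return C.T @ alpha

    def smoothing_proximal_gradient(X, y, C, lam, mu=1e-2, n_iter=500):
        """Toy accelerated solver for 0.5*||y - X b||^2 + lam*||C b||_1
        (e.g. graph-guided fusion), with the penalty smoothed as above.
        Illustrative sketch only, not the authors' code."""
        p = X.shape[1]
        # Lipschitz constant of the smoothed objective's gradient
        L = np.linalg.norm(X, 2) ** 2 + lam * np.linalg.norm(C, 2) ** 2 / mu
        b = np.zeros(p); w = b.copy(); t = 1.0
        for _ in range(n_iter):
            g = X.T @ (X @ w - y) + lam * smoothed_grad(C, w, mu)
            b_new = w - g / L                      # gradient step
            t_new = 0.5 * (1 + np.sqrt(1 + 4 * t * t))
            w = b_new + (t - 1) / t_new * (b_new - b)  # FISTA momentum
            b, t = b_new, t_new
        return b
    ```

    Because the smoothed penalty is differentiable with a known Lipschitz constant, each iteration is a plain gradient step, which is what makes the approach scale to penalties whose exact proximal operator is hard to compute.
    
    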

    Coulomb Drag of Massless Fermions in Graphene

    Using a novel structure, consisting of two independently contacted graphene single layers separated by an ultra-thin dielectric, we experimentally measure the Coulomb drag of massless fermions in graphene. At temperatures above 50 K, the Coulomb drag follows a temperature and carrier density dependence consistent with the Fermi-liquid regime. As the temperature is reduced, the Coulomb drag exhibits giant fluctuations of increasing amplitude, owing to the interplay between coherent transport in the graphene layer and the interaction between the two layers. Comment: 5 pages, 5 figures